Descent via Koszul extensions

Authors

Abstract


Similar resources

Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to be computationally superior. This paper explores...
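The coordinate descent idea referenced here can be illustrated for the lasso (L1-penalized least squares): each coordinate is updated in closed form via soft-thresholding while the others are held fixed. The sketch below is a minimal illustration of that cyclic update, not the paper's penalized Bregman divergence estimator; the function names are ours.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the closed-form solution of the
    one-dimensional lasso subproblem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso:
    minimize (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n        # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b
```

For a large enough penalty the estimate collapses to the zero vector, which is the variable-selection behavior the abstract alludes to.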


Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property

Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
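To make the terminology concrete, here is a minimal nonlinear conjugate gradient sketch with the classical PRP+ update (beta clamped at zero) and a steepest-descent restart whenever the direction fails the descent test. This is a simplified illustration, not the paper's modified three-term methods, and it uses Armijo backtracking for brevity where the cited convergence theory assumes (strong) Wolfe line searches.

```python
import numpy as np

def prp_plus_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Nonlinear CG with the PRP+ formula
        beta_k = max(g_k . (g_k - g_{k-1}) / ||g_{k-1}||^2, 0),
    restarting along -g_k whenever d_k is not a descent direction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d > -1e-12 * np.linalg.norm(g) * np.linalg.norm(d):
            d = -g                       # restart: enforce descent
        # Armijo backtracking line search (Wolfe conditions omitted)
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # PRP+
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

The "sufficient descent" property the abstract refers to is a uniform bound g_k . d_k <= -c ||g_k||^2; the crude restart above only enforces plain descent, which is where the proposed three-term directions improve on the classical ones.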


Weil descent attack for Kummer extensions

In this paper, we show how the Weil descent attack of Gaudry, Hess and Smart can be adapted to work for some hyperelliptic curves defined over fields of odd characteristic. This attack applies to a family of hyperelliptic and superelliptic curves over quadratic field extensions, as well as two families of hyperelliptic curves defined over cubic extensions. We also show that those are the only f...


Dimensions of Triangulated Categories via Koszul Objects

Lower bounds for the dimension of a triangulated category are provided. These bounds are applied to stable derived categories of Artin algebras and of commutative complete intersection local rings. As a consequence, one obtains bounds for the representation dimensions of certain Artin algebras.


Learning ReLUs via Gradient Descent

In this paper we study the problem of learning Rectified Linear Units (ReLUs), which are functions of the form x ↦ max(0, ⟨w,x⟩) with w ∈ R^d denoting the weight vector. We study this problem in the high-dimensional regime where the number of observations is smaller than the dimension of the weight vector. We assume that the weight vector belongs to some closed set (convex or nonconvex) which captu...
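The basic procedure the abstract studies, gradient descent on the squared loss of a single ReLU, can be sketched as follows. This is a plain unconstrained version for illustration; the paper's setting additionally projects onto the constraint set containing w, which we omit. The initialization X.T @ y / n is our choice (a correlation-based starting point), since starting at zero leaves every ReLU inactive and the gradient vanishes.

```python
import numpy as np

def fit_relu(X, y, lr=0.1, n_iter=500):
    """Gradient descent on L(w) = (1/(2n)) * sum_i (max(0, <w, x_i>) - y_i)^2,
    using the almost-everywhere gradient of the ReLU."""
    n, d = X.shape
    w = X.T @ y / n                     # nonzero init so some units are active
    for _ in range(n_iter):
        z = X @ w
        resid = np.maximum(z, 0.0) - y
        grad = X.T @ (resid * (z > 0)) / n   # ReLU derivative is 1{z > 0}
        w -= lr * grad
        # (a projection onto the constraint set would go here in the
        #  regularized, high-dimensional setting of the abstract)
    return w
```

In the realizable case (y generated by a planted ReLU) this recovers predictions that fit the data closely, which is the phenomenon the paper analyzes under far fewer observations than dimensions.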



Journal

Journal title: Journal of Algebra

Year: 2009

ISSN: 0021-8693

DOI: 10.1016/j.jalgebra.2008.03.007